Similar resources
Latent semantics in language models
This paper investigates three different sources of information and their integration into language modelling. Global semantics is modelled by Latent Dirichlet Allocation and brings long range dependencies into language models. Word clusters given by semantic spaces enrich these language models with short range semantics. Finally, our own stemming algorithm is used to further enhance the perform...
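The integration the abstract describes, combining a global topic-based estimate with a short-range n-gram estimate, is commonly realized as linear interpolation. A minimal sketch of that general scheme (the function name, probabilities, and weight `lam` are illustrative, not the paper's implementation):

```python
def interpolate(p_topic, p_ngram, lam=0.3):
    """Combine a long-range (topic-model) word probability with a
    short-range (n-gram) one via linear interpolation.

    lam weights the global component; 1 - lam weights the local one.
    """
    return lam * p_topic + (1.0 - lam) * p_ngram

# Toy example: a word the topic model finds plausible but the
# bigram context finds likelier still.
p = interpolate(p_topic=0.02, p_ngram=0.10, lam=0.3)
```

The interpolation weight would normally be tuned on held-out data rather than fixed by hand.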
Latent words recurrent neural network language models
This paper proposes a novel language modeling approach called latent word recurrent neural network language model, which solves the problems present in both recurrent neural network language models (RNNLMs) and latent word language models (LWLMs). The proposed model has a soft class structure based on a latent variable space as well as LWLM, where the latent variable space is modeled using RNNL...
A Latent Variable Recurrent Neural Network for Discourse Relation Language Models
This paper presents a novel latent variable recurrent neural network architecture for jointly modeling sequences of words and (possibly latent) discourse relations that link adjacent sentences. A recurrent neural network generates individual words, thus reaping the benefits of discriminatively-trained vector representations. The discourse relations are represented with a latent variable, which ...
Mixture of latent words language models for domain adaptation
This paper introduces a novel language model (LM) adaptation method based on mixture of latent word language models (LWLMs). LMs are often constructed as mixture of n-gram models, whose mixture weights are optimized using target domain data. However, n-gram mixture modeling is not flexible enough for domain adaptation because model merger is conducted on the observed word space. Since the words...
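The mixture-of-LMs baseline this abstract contrasts against, component models combined with weights optimized on target-domain data, is typically fit with EM on held-out text. A hedged sketch under that assumption (the toy per-token probabilities and helper name are hypothetical):

```python
def em_mixture_weights(comp_probs, n_iters=50):
    """Estimate mixture weights for component LMs via EM.

    comp_probs: one row per held-out token, one column per component
    model, holding each model's probability for that token.
    Returns weights that (locally) maximize mixture likelihood.
    """
    k = len(comp_probs[0])
    w = [1.0 / k] * k  # start from a uniform mixture
    for _ in range(n_iters):
        counts = [0.0] * k
        for probs in comp_probs:
            # E-step: posterior responsibility of each component
            denom = sum(w[j] * probs[j] for j in range(k))
            for j in range(k):
                counts[j] += w[j] * probs[j] / denom
        # M-step: new weights are normalized expected counts
        w = [c / len(comp_probs) for c in counts]
    return w

# Toy target-domain data: two component LMs scored on three tokens
data = [[0.1, 0.4], [0.2, 0.3], [0.5, 0.1]]
weights = em_mixture_weights(data)
```

The weights always form a proper distribution; the second component gets more mass when it assigns higher probability to the held-out tokens on average.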
Explicit vs. Latent Concept Models for Cross-Language Information Retrieval
The field of information retrieval and text manipulation (classification, clustering) still strives for models allowing semantic information to be folded in to improve performance with respect to standard bag-of-word based models. Many approaches aim at a concept-based retrieval, but differ in the nature of the concepts, which range from linguistic concepts as defined in lexical resources such ...
Journal
Journal title: Proceedings of the AAAI Conference on Artificial Intelligence
Year: 2020
ISSN: 2374-3468,2159-5399
DOI: 10.1609/aaai.v34i05.6298